Gradient-based kernel dimension reduction for regression

Authors

  • Kenji Fukumizu
  • Chenlei Leng
Abstract

This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find directions in the explanatory variables that explain the response sufficiently: this is called sufficient dimension reduction. The proposed method is based on an estimator for the gradient of the regression function, considered for the feature vectors mapped into reproducing kernel Hilbert spaces. It is proved that the method is able to estimate the directions that achieve sufficient dimension reduction. In comparison with other existing methods, the proposed one has wide applicability without strong assumptions on the ...

∗The Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa, Tokyo 190-8562, Japan
†Department of Statistics, University of Warwick, Coventry, CV4 7AL, UK, and Department of Statistics and Applied Probability, National University of Singapore, 6 Science Drive 2, Singapore 117546
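The gradient-based estimator described in the abstract can be sketched roughly as follows: estimate the averaged outer product of regression-function gradients from kernel Gram matrices, then take its leading eigenvectors as the dimension-reduction directions. This is a minimal illustrative sketch assuming Gaussian kernels; the function name, bandwidths, and regularization constant are hypothetical choices, not values from the paper.

```python
import numpy as np

def gkdr(X, Y, d, sigma_x=1.0, sigma_y=1.0, eps=1e-3):
    """Sketch of gradient-based kernel dimension reduction.

    X: (n, p) predictors; Y: (n,) or (n, q) responses.
    Returns a (p, d) matrix with orthonormal columns spanning the
    estimated sufficient-dimension-reduction subspace.
    """
    n, p = X.shape
    Y = Y.reshape(n, -1)

    def gram(A, sigma):
        sq = np.sum(A**2, axis=1)
        D = sq[:, None] + sq[None, :] - 2 * A @ A.T
        return np.exp(-D / (2 * sigma**2))

    Kx = gram(X, sigma_x)
    Ky = gram(Y, sigma_y)

    # Regularized inverse (K_X + n*eps*I)^{-1}, as in kernel ridge regression
    Kx_inv = np.linalg.inv(Kx + n * eps * np.eye(n))
    F = Kx_inv @ Ky @ Kx_inv          # symmetric n x n middle factor

    # Gradient of the Gaussian kernel k(x, X_j) w.r.t. x, evaluated at x = X_i:
    # dK[i, j, :] = -(X_i - X_j) / sigma_x^2 * k(X_i, X_j)
    diff = (X[:, None, :] - X[None, :, :]) / sigma_x**2   # n x n x p
    dK = -diff * Kx[:, :, None]

    # Average the projected gradient outer products over the sample points
    M = np.zeros((p, p))
    for i in range(n):
        Gi = dK[i]                    # n x p
        M += Gi.T @ F @ Gi
    M /= n

    # Leading eigenvectors of the (symmetric) matrix M give the directions
    w, V = np.linalg.eigh(M)
    return V[:, np.argsort(w)[::-1][:d]]
```

On data where the response depends on a single linear direction of the predictors, the top estimated direction should align with that direction.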


Similar articles

Gradient-based kernel dimension reduction for supervised learning

This paper proposes a novel kernel approach to linear dimension reduction for supervised learning. The purpose of the dimension reduction is to find directions in the input space that explain the output as effectively as possible. The proposed method uses an estimator for the gradient of the regression function, based on the covariance operators on reproducing kernel Hilbert spaces. In comparison wit...


Canonical kernel dimension reduction

A new kernel dimension reduction (KDR) method based on the gradient space of canonical functions is proposed for sufficient dimension reduction (SDR). Similar to existing KDR methods, this new method achieves SDR for arbitrary distributions, but with more flexibility and improved computational efficiency. The choice of loss function in cross-validation is discussed, and a two-stage screening pr...


Protection Scheme of Power Transformer Based on Time–Frequency Analysis and KSIR-SSVM

The aim of this paper is to extend a hybrid protection plan for Power Transformers (PT) based on MRA-KSIR-SSVM. This paper offers a new scheme for the protection of power transformers to distinguish internal faults from inrush currents. Some significant characteristics of differential currents under real PT operating circumstances are extracted. In this paper, Multi Resolution Analysis (MRA) is use...


Sliced Coordinate Analysis for Effective Dimension Reduction and Nonlinear Extensions

Sliced inverse regression (SIR) is an important method for reducing the dimensionality of input variables. Its goal is to estimate the effective dimension reduction directions. In classification settings, SIR is closely related to Fisher discriminant analysis. Motivated by reproducing kernel theory, we propose a notion of nonlinear effective dimension reduction and develop a nonlinear extension...
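Sliced inverse regression as summarized above can be illustrated with a minimal sketch: standardize the predictors, average them within slices of the sorted response, and take leading eigenvectors of the weighted covariance of those slice means. This is a generic textbook-style SIR sketch, not code from the cited paper; the function name and the number of slices are illustrative.

```python
import numpy as np

def sir(X, y, d, n_slices=10):
    """Minimal sliced inverse regression sketch.

    X: (n, p) predictors; y: (n,) response.
    Returns a (p, d) matrix of estimated effective
    dimension reduction directions (unit-norm columns).
    """
    n, p = X.shape
    mu = X.mean(axis=0)

    # Standardize: Z = (X - mu) @ Sigma^{-1/2}
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals**-0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt

    # Slice observations by sorted response, average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += len(idx) / n * np.outer(m, m)

    # Leading eigenvectors, mapped back to the original predictor scale
    w, V = np.linalg.eigh(M)
    B = inv_sqrt @ V[:, np.argsort(w)[::-1][:d]]
    return B / np.linalg.norm(B, axis=0)
```

For a monotone link, e.g. a response driven by a single linear combination of the predictors, the leading SIR direction should recover that combination.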


Gradient-based kernel method for feature extraction and variable selection

We propose a novel kernel approach to dimension reduction for supervised learning: feature extraction and variable selection; the former constructs a small number of features from predictors, and the latter finds a subset of predictors. First, a method of linear feature extraction is proposed using the gradient of regression function, based on the recent development of the kernel method. In com...



Publication date: 2013